
    Science-driven 3D data compression

    Photometric redshift surveys map the distribution of matter in the Universe through the positions and shapes of galaxies with poorly resolved measurements of their radial coordinates. While a tomographic analysis can recover some of the large-scale radial modes present in the data, this approach suffers from a number of practical shortcomings, and the criteria used to choose a particular binning scheme are commonly blind to the ultimate science goals. We present a method designed to separate and compress the data into a small number of uncorrelated radial modes, circumventing some of the problems of standard tomographic analyses. The method is based on the Karhunen-Loève (KL) transform and is connected to other 3D data-compression bases advocated in the literature, such as the Fourier-Bessel decomposition. We apply this method to both weak lensing and galaxy clustering. In the case of galaxy clustering, we show that the resulting optimal basis is closely associated with the Fourier-Bessel basis, and that for certain observables, such as the effects of magnification bias or primordial non-Gaussianity, the bulk of the signal can be compressed into a small number of modes. In the case of weak lensing, we show that the method is able to compress the vast majority of the signal-to-noise into a single mode, and that optimal cosmological constraints can be obtained considering only three uncorrelated KL eigenmodes, considerably simplifying the analysis with respect to a traditional tomographic approach.
    Comment: 14 pages, 11 figures. Comments welcome
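    In its simplest generic form, the KL separation described above reduces to a generalized eigenproblem between a signal covariance and a noise covariance. The following Python snippet is a minimal sketch of that generic step under assumed toy covariances S and N; the function names and the demonstration data are illustrative, not the paper's actual estimator.

    import numpy as np

    def kl_modes(S, N):
        """Solve the generalized eigenproblem S e = lam N e and return
        the modes sorted by decreasing signal-to-noise eigenvalue."""
        # Whiten by the noise: N = L L^T, then diagonalize L^{-1} S L^{-T}.
        L = np.linalg.cholesky(N)
        Linv = np.linalg.inv(L)
        lam, V = np.linalg.eigh(Linv @ S @ Linv.T)   # ascending eigenvalues
        order = np.argsort(lam)[::-1]                # decreasing S/N
        E = (Linv.T @ V[:, order]).T                 # rows = KL mode weights
        return lam[order], E

    def compress(data, E, n_modes=3):
        """Project a data vector onto the leading n_modes KL modes."""
        return E[:n_modes] @ data

    # Toy demonstration with a random signal covariance (purely illustrative).
    rng = np.random.default_rng(0)
    n = 20
    A = rng.normal(size=(n, n))
    S = A @ A.T                                      # toy signal covariance
    N = np.eye(n)                                    # toy noise covariance
    lam, E = kl_modes(S, N)
    d = rng.multivariate_normal(np.zeros(n), S + N)
    print(compress(d, E))                            # three KL amplitudes

    Because the returned modes are orthonormal with respect to the noise covariance, their amplitudes are uncorrelated under the noise, which is what allows a handful of leading modes to carry most of the signal-to-noise.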

    System-Level Design of Energy-Proportional Many-Core Servers for Exascale Computing

    Continuous advances in manufacturing technologies are enabling the development of more powerful and compact high-performance computing (HPC) servers made of many-core processing architectures. However, in recent years the soaring demand for computing power has grown faster than semiconductor technology evolution can sustain, and has produced, as an undesirable side effect, a surge in power consumption and heat density in these new HPC servers, which results in significant performance degradation. In this keynote, I advocate completely revising current HPC server architectures. In particular, inspired by the mammalian brain, I propose to design a disruptive three-dimensional (3D) computing server architecture that overcomes the prevailing worst-case power and cooling provisioning paradigm for servers. This new 3D server design champions new system-level thermal modeling, which can be used by novel proactive energy controllers for detailed heat and energy management in many-core HPC servers, thanks to micro-scale liquid cooling. Then, I will show the impact of new near-threshold computing architectures on server design, and how we can integrate new on-chip microfluidic fuel cell networks to enable energy scalability in future generations of many-core HPC servers targeting Exascale computing.
    Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech
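    As a rough illustration of the proactive thermal-management idea mentioned in the abstract (a sketch under assumed toy parameters, not the speaker's actual controller or server model), the following Python snippet couples a lumped thermal RC model of a compute node to a controller that sets the liquid-cooling flow rate from a short power forecast rather than from the current temperature alone.

    import numpy as np

    # Toy lumped thermal model: C_TH * dT/dt = P - (T - T_IN) * G(flow).
    # All constants are illustrative, not measured server parameters.
    C_TH = 50.0      # thermal capacitance [J/K]
    T_IN = 30.0      # coolant inlet temperature [degC]
    T_MAX = 70.0     # temperature ceiling [degC]
    DT = 0.1         # simulation time step [s]

    def conductance(flow):
        """Thermal conductance to the liquid loop rises with flow in [0, 1]."""
        return 0.5 + 4.0 * flow   # [W/K]

    def proactive_flow(p_forecast):
        """Pick the smallest flow whose steady-state temperature under the
        forecast power stays at or below T_MAX -- acting before the thermal
        excursion, and pumping no harder than necessary."""
        g_needed = p_forecast / (T_MAX - T_IN)
        return min(max((g_needed - 0.5) / 4.0, 0.0), 1.0)

    # Power steps from 40 W to 120 W halfway through; the controller sees
    # the step 5 s ahead via the (here, perfectly known) power trace.
    power = np.concatenate([np.full(300, 40.0), np.full(300, 120.0)])
    T = 45.0
    for step, p in enumerate(power):
        ahead = min(step + 50, len(power) - 1)
        flow = proactive_flow(power[ahead])
        T += DT / C_TH * (p - (T - T_IN) * conductance(flow))
    print(f"final temperature: {T:.1f} degC")   # converges below T_MAX

    The design point is energy proportionality: the pump works only as hard as the forecast load requires, instead of being provisioned for the worst case at all times.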

    The variance conjecture on projections of the cube

    We prove that the uniform probability measure $\mu$ on every $(n-k)$-dimensional projection of the $n$-dimensional unit cube verifies the variance conjecture with an absolute constant $C$: $$\textrm{Var}_\mu|x|^2 \leq C \sup_{\theta\in S^{n-1}}\mathbb{E}_\mu\langle x,\theta\rangle^2\,\mathbb{E}_\mu|x|^2,$$ provided that $1\leq k\leq\sqrt{n}$. We also prove that if $1\leq k\leq n^{\frac{2}{3}}(\log n)^{-\frac{1}{3}}$, the conjecture is true for the family of uniform probabilities on its projections on random $(n-k)$-dimensional subspaces.
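    For context, the inequality above is the general form of the variance conjecture; a standard formulation from the literature (paraphrased here, not stated in the abstract itself) reads:

    % General variance conjecture: for a centered log-concave probability
    % measure mu on R^n there is an absolute constant C such that
    \[
      \operatorname{Var}_\mu |x|^2 \;\le\; C\,\lambda_\mu\,\mathbb{E}_\mu |x|^2,
      \qquad
      \lambda_\mu \;=\; \sup_{\theta\in S^{n-1}} \mathbb{E}_\mu \langle x,\theta\rangle^2,
    \]
    % where lambda_mu is the largest eigenvalue of the covariance matrix of mu.

    The abstract's result verifies exactly this inequality for uniform measures on projections of the cube, with $k$ the codimension of the projected subspace.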